
Latest news on #data infrastructure

The Strategic Value Of Data Infrastructure In R&D

Forbes

3 days ago

  • Business
  • Forbes

The Strategic Value Of Data Infrastructure In R&D

Akshay Talekar, Lead Data Scientist at UL Research Institutes.

At many organizations I've observed, research and development (R&D) is viewed as the driver of discovery and competitiveness, and leaders invest heavily in scientific talent and instrumentation. But in my experience, they often overlook a fundamental pillar of R&D: data infrastructure. Data infrastructure is the system of tools, technologies and processes that enables organizations to gather, store, manage and access data efficiently and securely. In today's digital R&D environment, a robust, flexible and well-integrated approach to data infrastructure is a strategic asset that can help organizations accelerate discovery, reduce time to insight and build seamless collaboration across teams.

Modern R&D efforts generate massive amounts of raw data from high-throughput experiments, theoretical simulations and IoT sensor outputs. But raw data alone doesn't create much value. To leverage it strategically, organizations should implement strong infrastructure to capture, store, standardize and analyze results. With strategic data infrastructure, teams can move from reactive to proactive discovery. Searchable, interoperable and contextualized data enables scientists to uncover trends across experiments, optimize designs in real time and drive outcomes (including with the use of AI). This turns R&D from trial and error into a system of scalable innovation.

Important Considerations For R&D Data Infrastructure

To implement strategic R&D data infrastructure, leaders need to keep four considerations in mind. First, they should have architecture that scales with discovery: R&D is inherently exploratory, and data types can change rapidly. Flexible architecture, often a hybrid of cloud and on-premises systems, is essential for supporting a variety of structured and unstructured data.
Investing in modular systems and clear data governance frameworks allows organizations to grow without re-architecting from scratch every time they add new equipment.

Leaders should also prioritize integration across silos. I've found that discovery often happens at the intersection of domains, yet many R&D groups operate in silos with fragmented tools and datasets, which can lead to missed opportunities. Strategic infrastructure connects instruments, lab information management systems (LIMS), synthesis and characterization equipment, and modeling platforms to create a unified data backbone. This enables interdisciplinary insights and can accelerate the handoff from lab to commercialization.

When deciding which tools to implement, leaders should think beyond storage and consider how tools help R&D teams form insights. Data infrastructure goes beyond where data lives; it also determines how data gets used. Integrating analytics, AI and machine learning tools, and visualization dashboards into user-friendly platforms empowers researchers to ask better questions and get faster answers. When done right, data infrastructure becomes an active collaborator in the discovery process.

Then there's security, compliance and reproducibility. Data infrastructure must enable traceability, version control and audit readiness, especially in regulated industries. But even outside industries such as pharmaceuticals and defense, reproducible science is good science. A secure and well-documented data pipeline can facilitate trust, lab safety and long-term protection of intellectual property.

How To Get Started

Leaders don't need to overhaul everything at once to build a robust R&D data infrastructure.
They can start by identifying and addressing critical pain points at their organizations, such as fragmented data sources, inaccessible results or repeated manual workflows. Then they can prioritize the use cases where better data flow would yield value. From there, I recommend that leaders take several steps. First, map the data ecosystem, taking stock of the existing tools, platforms and data flows across teams. Second, engage cross-functional stakeholders: bring scientists and IT operations team members to the table early so everyone can align on needs and constraints. Third, invest in modular and interoperable solutions, choosing tools that can scale and adapt as R&D priorities evolve. Finally, create a data governance plan that establishes standards for data formats, metadata, ownership and access controls.

Strategic Benefits Of Data Infrastructure Beyond The Lab

The benefits of data infrastructure go beyond the lab. It enables organizations to generate new insights more seamlessly, lowers the likelihood of duplicated effort and supports better decision-making. It also future-proofs organizations: those with strong data foundations are better positioned to adopt emerging technologies, from AI to autonomous self-driving labs. In my view, leaders should treat R&D data infrastructure with the same strategic weight as enterprise IT or cybersecurity. This means budgeting for integration, talent and tooling along with equipment. It means aligning data architecture with scientific goals, not just items on an IT checklist. Most importantly, it means recognizing that in the knowledge economy, the organizations that manage their data best will win.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives. Do I qualify?

Data limitations hindering AI adoption: insights from financial services

Fast Company

05-08-2025

  • Business
  • Fast Company

Data limitations hindering AI adoption: insights from financial services

You've likely heard the adage that "You can't have an AI strategy without a data strategy." But what does that mean in practice? As someone who regularly explores this question with data leaders, I've gained insight into the challenges organizations face when implementing AI and how they're overcoming them.

Enterprise data management is a daunting task even before factoring in the levels of accuracy, completeness, and consistency required to bring reliable AI applications to market. And the challenges are widespread. According to a survey by EY, only 36% of senior leaders report that their organizations are investing fully in data infrastructure (covering quality, accessibility, and governance of data) to support AI initiatives at scale. Another EY survey found that 67% of leaders say data infrastructure is holding them back. I, too, am no stranger to how data limitations affect enterprise initiatives: inaccurate and incomplete data undermines AI-driven marketing campaigns, from misdirecting targeting efforts to potentially perpetuating societal bias. However, I've found that the financial services industry has been setting a great example of how to embrace data readiness, paving the way for efficient AI adoption. The conversations reveal a common understanding: successful AI implementation doesn't depend on good technology alone; it requires strong data strategies and the judgment to know when and where to deploy AI effectively.

FINANCIAL SERVICES MAKING DATA 'AI-READY'

Data readiness is often misunderstood as a technical checklist, but in reality it requires a fundamental shift in how we think about data. Historically, financial services have relied on data labels and tags that seemed self-evidently correct for reporting purposes. However, AI demands a deeper level of scrutiny, moving beyond surface accuracy to ensure that the data truly reflects the nuances needed for machine learning models to perform effectively.
One of the most striking examples I've encountered involved a financial services company's three-year effort to build a discriminative AI model. Despite having what they thought was a well-labeled dataset, early attempts resulted in poor accuracy. It wasn't until consultation with their algorithm team that they uncovered a crucial gap: the data labels were accurate for reporting purposes but failed to account for variability in market conditions and trade parameters. To address this, the team applied techniques like principal component analysis (PCA) and interquartile range (IQR) filtering to reduce noise in their data. They also created new features specifically designed to filter training datasets, identifying and isolating trades that fit typical implementation patterns. These methods transformed the data from being merely correct to truly AI-ready: fit for the purpose of building reliable models.

This example highlights the importance of viewing data as more than a static resource; it's something to be actively curated and continuously refined. AI readiness isn't about having a perfect dataset; it's about understanding the imperfections and compensating for them through thoughtful design and ongoing iteration.

TACKLING DATA GAPS WITH TRANSPARENCY AND COLLABORATION

Addressing data gaps is rarely a solitary effort. Many leaders I've spoken to have embraced transparency and collaboration, which are invaluable mechanisms for enhancing data quality and driving meaningful AI outcomes. This is an approach we should all adopt as we embark on our AI journeys. In my experience, the most successful organizations have implemented transparency frameworks around AI initiatives, typically centered on three pillars:

1. Data Disclosure

Leading firms openly share what data was used in a model and, more importantly, what data they wish they had.
Being upfront about these gaps can lead to valuable feedback from clients and colleagues, helping to identify areas where additional data collection or adjustments are needed.

2. Feature Transparency

Forward-thinking organizations disclose both the features used in their models and those that were intentionally excluded. This sparks valuable discussions with stakeholders, who may even identify features that hadn't been considered, leading to significant model improvements.

3. Model Selection Rationale

AI trailblazers explain the reasoning behind the models they use, whether it's XGBoost, a random forest, or a less transparent neural network. This clarity builds trust with both internal teams and external clients, ensuring they feel included in the process rather than sidelined by it.

When implemented, these principles help foster a culture of openness that addresses skepticism head-on. Organizations embracing them can see more success as they create an environment where data quality issues are identified early and addressed collaboratively. This structured transparency makes data more accessible and understandable across the enterprise, bridging the gap between technical teams and business stakeholders.

A FRAMEWORK FOR THOUGHTFUL AI DEPLOYMENT

When it comes to deploying AI, whether in financial services or another industry, one of the most important insights I've gleaned is that just because you can build a model doesn't mean you should deploy it. Thoughtful AI deployment isn't about rushing to apply AI to every problem; it's about making purposeful decisions that consider both the value AI can add and its associated risks. As one senior data leader emphasized in a recent conversation, this starts with a clear understanding of the context in which a model will be used. They shared how their team once spent months manually tagging data to ensure a generative AI model could provide accurate and contextualized responses.
This process was labor-intensive but critical to building trust in the model's outputs, particularly in a regulated industry like finance, where mistakes carry significant reputational and compliance risks.

Another key consideration is knowing when AI is not the right tool for the job. Sometimes a simpler solution, like a Tableau dashboard or a regression model, is a better fit. Being transparent about where your organization chooses not to deploy AI, and why, is just as important as highlighting successes. This openness builds trust both within your organization and with clients.

AI'S FUTURE IN FINANCIAL SERVICES

Ultimately, deploying AI thoughtfully requires good judgment, and that judgment must be guided by the principle that AI should enhance human decision-making rather than replace it. Financial services are on the fast track to adoption, and by focusing on thoughtful deployment, transparency, and collaboration, we can unlock AI's potential while ensuring it remains a tool to empower human decision-making.
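The PCA and IQR techniques named in the trading case study earlier can be sketched in a few lines of NumPy. This is a minimal illustration of the general methods, not the firm's actual pipeline; the data, thresholds, and component counts are assumptions for demonstration.

```python
import numpy as np

def iqr_filter(X, k=1.5):
    """Keep rows whose every feature lies within k * IQR of the quartiles."""
    q1, q3 = np.percentile(X, [25, 75], axis=0)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    mask = np.all((X >= lo) & (X <= hi), axis=1)
    return X[mask], mask

def pca_denoise(X, n_components=2):
    """Project onto the top principal components and reconstruct,
    discarding low-variance directions that mostly carry noise."""
    mu = X.mean(axis=0)
    Xc = X - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components]          # top principal directions
    return (Xc @ V.T) @ V + mu     # low-rank reconstruction

# Synthetic "trades": 500 observations of 5 features, with a few
# injected atypical trades far outside normal market conditions.
rng = np.random.default_rng(0)
trades = rng.normal(size=(500, 5))
trades[::50] += 25  # 10 outlier rows

clean, kept = iqr_filter(trades)
denoised = pca_denoise(clean, n_components=3)
print(f"kept {kept.sum()} of {len(trades)} trades")
```

The IQR pass isolates trades that fit typical patterns before training, and the PCA reconstruction suppresses low-variance noise directions, mirroring the two-step cleanup the article describes.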

How Data Infrastructure Quietly Drives The AI Engine

Forbes

01-08-2025

  • Business
  • Forbes

How Data Infrastructure Quietly Drives The AI Engine

Michel Tricot is Cofounder and CEO of Airbyte.

AI cannot operate effectively without context, and that requires being fed by reliable data flows. This is where I've watched most AI initiatives fall apart. Companies get excited about the potential of AI technology but forget the unglamorous work of building data infrastructure. You can't just plug an AI model into your existing systems and expect magic to happen. Reliable data pipelines aren't just nice to have; they are the foundation that makes everything else possible. You can't build a city without roads, plumbing and electricity integrated into a well-organized central system, and the same is true for AI applications. To make reasoned decisions, AI must have full context and information, and that can only happen by bringing all the right data together.

This is why Salesforce is spending $8 billion to acquire Informatica, a legacy mainstay in data integration technology. While this may not seem like an AI play, it is shaping up to be an essential part of Salesforce's AI strategy. Salesforce's Agentforce aims to transform how businesses use AI agents to handle everyday operations. But there is a catch that I've seen trip up countless companies: AI agents can't function effectively without access to complete, well-organized data. Most enterprises still struggle with fragmented systems, with data trapped in silos and scattered across platforms that don't communicate, making it difficult for AI to operate with full context.

This moment highlights a broader shift in enterprise AI strategy. Success increasingly hinges not on the model itself but on the ecosystem that supports it. Forward-looking companies are beginning to realize that AI agents are only as effective as the infrastructure surrounding them, particularly the ability to unify and orchestrate data from across the organization.
Rather than treating data integration as a separate IT concern, the most strategic moves we're seeing embed it directly into AI planning. This helps reduce friction and accelerate deployment while enabling AI to work with complete, real-time context. However, it also introduces a new kind of trade-off: the more seamless the stack, the more locked in an organization may become. Businesses must now weigh the benefits of speed and simplicity against the risks of vendor dependency and reduced long-term flexibility. I've always believed that open-source software fuels innovation, lets you customize and expand your tools and, most importantly, helps you avoid being tied to a single vendor.

The Salesforce-Informatica deal marks a big change in how we think about AI in business. It confirms something I've long evangelized: the real advantage doesn't come from AI models themselves, since everyone will have access to similar ones. The true strength lies in how well a company can organize, protect and use its own unique data to power AI. To do this, I recommend that organizations build their infrastructure to be scalable, portable and sovereign. Leverage open source and flexible data formats for your destinations, and ensure sensitive data is under your control, not hosted and parsed by external parties. This blend of flexibility and sovereignty offers the best approach for modern data management.

The companies that understand this will be the ones positioned to successfully transform their businesses with AI. Those that don't risk building impressive demos that stall when faced with the complexity of real-world data environments. Preparing for data movement now can help businesses ensure they're ready to build scalable, effective AI applications.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives. Do I qualify?

WeBank Technology Services Showcases Shenzhen-Hong Kong Cross-Boundary Data Validation Platform at Data Summit 2025

Yahoo

29-07-2025

  • Business
  • Yahoo

WeBank Technology Services Showcases Shenzhen-Hong Kong Cross-Boundary Data Validation Platform at Data Summit 2025

HONG KONG, July 28, 2025 /PRNewswire/ -- The Hong Kong Monetary Authority (HKMA) hosted its 2025 Data Summit on Monday, drawing more than 800 representatives from the city's various sectors. HKMA Chief Executive Eddie Yue delivered a keynote address emphasizing the city's commitment to financial infrastructure innovation. During the summit, the Shenzhen-Hong Kong Cross-Boundary Data Validation Platform (DVP), co-operated by WeBank Technology Services, was introduced as the first cross-boundary data verification service connected to the HKMA's Commercial Data Interchange (CDI). CDI, a key pillar of the HKMA's "Fintech 2025" strategy, aims to enhance financial inclusion through secure and efficient data sharing.

Cross-boundary data bridge built on blockchain principles

Developed under the guidance of the HKMA, the Shenzhen Municipal Cyberspace Administration, the Hong Kong and Macao Affairs Office of the Shenzhen Municipal People's Government, the Shenzhen Municipal Financial Regulatory Bureau, the Authority of Qianhai Shenzhen-Hong Kong Modern Service Industry Cooperation Zone of Shenzhen, the Shenzhen Branch of the People's Bank of China, and the Shenzhen Regulatory Bureau of the National Financial Regulatory Administration, the DVP positions itself as a next-generation cross-boundary data infrastructure. The platform enables trusted validation of personal and corporate data through hash-based verification, without transferring or storing any original data files. Powered by the immutability and traceability of blockchain technology, the platform supports data portability in compliance with regulatory requirements from both jurisdictions. The DVP collaborates with authoritative data providers in the Chinese Mainland across sectors such as finance and public services, allowing for verification of documents such as personal and corporate credit reports, bank statements, and enterprise credit information, among others.
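The core idea of hash-based verification without moving the underlying files can be shown with a minimal sketch. The DVP's actual protocol is not public, so the flow, field contents, and party roles below are illustrative assumptions only.

```python
import hashlib

def fingerprint(document: bytes) -> str:
    """Hash the document locally; only this digest ever crosses the boundary."""
    return hashlib.sha256(document).hexdigest()

# Data provider side (e.g. a Mainland credit bureau): registers only the
# digest of the authoritative record on the validation platform.
registered = {fingerprint(b"credit-report-v1: score=720")}

# Receiving side (e.g. a Hong Kong bank): the applicant supplies a copy of
# the document, and the bank checks its digest against the registered one.
# The original file is never transferred to or stored by the platform.
def validate(submitted: bytes) -> bool:
    return fingerprint(submitted) in registered

print(validate(b"credit-report-v1: score=720"))  # authentic copy -> True
print(validate(b"credit-report-v1: score=999"))  # tampered copy -> False
```

Because a cryptographic hash changes completely under any modification, the platform can attest that a submitted copy matches the provider's original while handling only digests, which is what allows compliance with data-residency rules on both sides.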
The service is designed to meet growing demand for cross-boundary financial services from residents and enterprises operating between Shenzhen and Hong Kong.

CDI-DVP integration to help streamline access for Hong Kong banks

CDI, launched under the HKMA's Fintech 2025 blueprint, serves as a foundational financial data infrastructure that lowers the cost and complexity of data exchange between banks and commercial data providers. With the integration of the DVP, Hong Kong banks can now, via a compliant, secure and efficient channel, improve risk assessment and the overall user experience of cross-boundary financial services. During the summit, Eddie Yue visited WeBank Technology Services' exhibition booth, where Huiya Yao, Head of Fintech Innovation at WeBank, shared the latest developments of the DVP. To date, the DVP has served more than 10 business entities and is being used in scenarios including cross-boundary credit assessment and financing.

Early gains from DVP adoption

At a panel session focused on cross-boundary data validation, representatives from Fusion Bank and ICBC (Asia) shared early outcomes from leveraging the DVP. Andy Li, Head of Corporate Banking at Fusion Bank, said: "Hong Kong serves as a pivotal gateway connecting the Mainland and international markets, and remains the preferred destination for Mainland enterprises going global. Fusion Bank has been committed to supporting Mainland SMEs in their overseas expansion and has approved over HKD100 million in loans for Hong Kong affiliates of GBA SMEs by leveraging the DVP for cross-boundary verification of credit information. Moving ahead, we will continue to utilize the DVP to deliver more convenient financing solutions for Hong Kong affiliates of GBA enterprises, supporting the advancement of financial inclusion in the GBA."
ICBC (Asia)'s Co-Head of Data Management Youping Song revealed plans to leverage the DVP to verify customers' central bank individual credit reports and other data types, enhancing credit status assessment for newly arrived Hong Kong residents from the Chinese Mainland and supporting talent schemes such as the "Top Talent Pass Scheme" (TTPS) and "Quality Migrant Admission Scheme" (QMAS).

Toward a Greater Bay Area data ecosystem

The DVP, officially launched in May 2024, is jointly operated by the China (Qianhai) Internet Exchange (CNIX), Shenzhen Credit Service Co. Ltd and WeBank Technology Services. WeBank provides the overall architecture design and technical support for the platform. Looking ahead, Huiya Yao shared that the DVP aims to integrate more diverse data sources, deepening data connectivity across the Greater Bay Area.

About WeBank Technology Services

Launched in Hong Kong in June 2024, WeBank Technology Services sets out to leverage WeBank's cutting-edge fintech capabilities and digital finance best practices to deliver superior digital finance and digital infrastructure solutions to digital banks, financial institutions, government agencies, and industry partners worldwide.

SOURCE WeBank Co. Ltd.

Who is Astronomer CEO Andy Byron?

The Sun

17-07-2025

  • Entertainment
  • The Sun

Who is Astronomer CEO Andy Byron?

TECH tycoon Andy Byron is the head of one of the fastest-growing companies in data infrastructure. He was thrust into the spotlight, literally, when Coldplay's Chris Martin accidentally exposed him seemingly having an affair with a colleague.

Who is Andy Byron?

Andy Byron is the CEO of Astronomer, a software development company reported to be worth more than $1.3 billion (£1 billion). He has been in the position since July 2023, according to his LinkedIn page. Before joining Astronomer, Byron held leadership roles in several technology-focused ventures, bringing a track record in "scaling enterprise tools and building teams across global markets". He was relatively unknown outside the tech sector... at least until this week. On Wednesday, a "kiss cam" at a Coldplay gig panned on to the married Byron and his companion, Kristin Cabot, prompting frontman Chris Martin to say: "Oh look at these two." Footage on TikTok shows the moment Byron and Cabot rushed to shield their faces as they flashed up on a big screen, draped in each other's arms, at the show in Boston, Massachusetts. The mortifying footage from Wednesday night's concert has gone viral on social media, having been shared thousands of times on X and TikTok.

Who is Kristin Cabot?

Cabot has been Astronomer's Chief People Officer for nine months. The HR staffer, who has the second surname Thornby in brackets on her LinkedIn, boasts on her page that she wins "trust with employees of all levels, from CEOs to managers to assistants".
